Fig. 2.17: Examples of signals. The signals s_1(t) to s_3(t) originate from a deterministic (exactly predetermined) process and can be completely described analytically as a polynomial or harmonic function (left). The random signal s_4(t), on the other hand, comes from a random process and cannot be fully expressed by analytical functions (right). (Horizontal axis: time t/s.)
the random variable is most likely to take. If X is a real discrete random variable taking the values (x_i)_{i \in \mathbb{N}} with the respective probabilities (p_i)_{i \in \mathbb{N}} (with \mathbb{N} as the set of natural numbers), the expected value (first moment) of the time series is given by:
E(X) = \mu_X = \sum_{i \in \mathbb{N}} x_i \, p_i = \sum_{i \in \mathbb{N}} x_i \, P(X = x_i) . \qquad (2.39)
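As a brief worked illustration (the fair die here is an assumed example, not taken from the text): with x_i = i and p_i = 1/6 for i = 1, \dots, 6, equation (2.39) gives

E(X) = \sum_{i=1}^{6} i \cdot \tfrac{1}{6} = \tfrac{21}{6} = 3.5 .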
The expectation E(X) is thus the mean \mu_X of X weighted by the probabilities P, i.e. the value to be expected on average for a realisation of X (first moment). If all N realisations are equally probable, p_i = p = 1/N, the expected value equals the arithmetic mean \mu_X = \mu of X. If the expected value is finite, i.e. E(X) = \mu_X < \infty, the second central moment or variance is the expected value of the squared deviation of the random variable X from the mean \mu_X:
\operatorname{Var}(X) = E\left((X - \mu_X)^2\right) = \sum_{i \in \mathbb{N}} (x_i - \mu_X)^2 \, P(X = x_i) . \qquad (2.40)
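Continuing the assumed die example, (2.40) evaluates to

\operatorname{Var}(X) = \sum_{i=1}^{6} (i - 3.5)^2 \cdot \tfrac{1}{6} = \tfrac{17.5}{6} = \tfrac{35}{12} \approx 2.92 .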
The variance is a quadratic quantity: it gives the mean squared deviation of the random variable from the expected value of X and is thus the expected value of the squared deviation (second moment). The associated non-squared quantity is the standard deviation \sigma_X, defined as the square root of the variance:
\sigma_X = \sqrt{\operatorname{Var}(X)} . \qquad (2.41)
Both the variance and the standard deviation are non-negative quantities; thus \operatorname{Var}(\cdot) \ge 0 and \sigma(\cdot) \ge 0 hold.
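A minimal numerical sketch of equations (2.39) to (2.41), assuming Python; the fair-die distribution is the same illustrative assumption as in the worked examples above:

from fractions import Fraction

def expected_value(values, probs):
    # First moment, eq. (2.39): E(X) = sum_i x_i * p_i
    return sum(x * p for x, p in zip(values, probs))

def variance(values, probs):
    # Second central moment, eq. (2.40): Var(X) = sum_i (x_i - mu_X)^2 * p_i
    mu = expected_value(values, probs)
    return sum((x - mu) ** 2 * p for x, p in zip(values, probs))

# Assumed illustrative distribution: a fair six-sided die
xs = [1, 2, 3, 4, 5, 6]
ps = [Fraction(1, 6)] * 6

mu = expected_value(xs, ps)        # 7/2 = 3.5
var = variance(xs, ps)             # 35/12 ≈ 2.92
sigma = float(var) ** 0.5          # eq. (2.41): sigma_X = sqrt(Var(X)) ≈ 1.71

print(mu, var, sigma)

Using exact fractions avoids floating-point rounding in the intermediate sums; the printed results reproduce the values derived above.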